Being late to the AI party is good for Apple
What’s the deal with AI?
Recently, Google released a new AI feature that is about as mature as Bing’s AI search was. At the top of some search results, Google now shows an “AI Overview” meant to answer the search query without the user having to open any additional website. These overviews include information from Reddit, like adding non-toxic glue to pizza to make the cheese stick to it better, which the AI probably got from this post.
This incident and many others demonstrate once more that what we call “AI” is not intelligent, and that we should not call it AI but an LLM (Large Language Model). In short, LLMs compress the data they are trained on into an interactive model that reproduces the information compressed into it, along with information contained in its current context.
In plain English: LLMs are computer algorithms that construct grammatically correct text. They can also pick up limited contextual cues, such as understanding that “a river bank” is not a financial institution, but that’s about it. Granted, LLMs are getting better, but right now an LLM ingests sarcastic and humorous content the same way it ingests factual content. The result is suggestions such as adding non-toxic glue to your pizza to make the cheese stick.
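To make that concrete, here is a deliberately oversimplified sketch: a toy bigram model in Python that only counts which word follows which in its training text and then continues a prompt with the most frequent follower. Real LLMs are neural networks trained over tokens, not word counts, and the tiny corpus below is made up for illustration. Even this toy shows the core problem, though: the model reproduces whatever phrasing was most common in its training data, with no notion of truth, sarcasm, or jokes.

```python
from collections import Counter, defaultdict

# Made-up training snippets: one factual, one joking (repeated, like a
# popular Reddit post). The model has no idea which is which; both are
# just sequences of words to be compressed into counts.
corpus = (
    "cheese sticks to pizza because its proteins bind when heated . "
    "cheese sticks to pizza better if you add non-toxic glue . "
    "cheese sticks to pizza better if you add non-toxic glue ."
)

# Count how often each word follows each other word.
followers = defaultdict(Counter)
words = corpus.split()
for current, nxt in zip(words, words[1:]):
    followers[current][nxt] += 1

def continue_text(prompt: str, length: int = 8) -> str:
    """Greedily extend the prompt with the most frequent next word."""
    out = prompt.split()
    for _ in range(length):
        candidates = followers.get(out[-1])
        if not candidates:
            break
        out.append(candidates.most_common(1)[0][0])
    return " ".join(out)

print(continue_text("cheese sticks to"))
```

Running this continues “cheese sticks to” with the glue joke, simply because that phrasing appeared more often in the training text. That is the same failure mode as the AI Overview, in miniature.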
Why not having any is a good thing
For a few years now, commentators online have speculated about when Apple will add AI to its products. What they mean is: When will Apple add flashy generative AI to its products? Because Apple’s products already utilize AI, just not LLMs, for many tasks.
Examples
- You’ve been able to find cats in your photos for years by typing “cat” in the search box on your iPhone.
- If you listen to audiobooks to fall asleep, your phone has probably been suggesting for a while now that you resume an audiobook when you go to bed.
- Typing on any modern Apple device became a collaborative experience with last year’s updates, when Apple added Predictive Text to iOS, iPadOS, and macOS. Note that Predictive Text is not the row of suggestions shown above the keyboard on your phone but the gray text that often appears to the right of your cursor.
This last example is a very non-flashy use of generative AI / an LLM. It’s very restrained, very controlled, and perfectly within Apple’s comfort zone. It won’t try to answer questions or write an entire email for you, yet it is pretty helpful, especially when typing on a small screen like an iPhone’s.
I think Apple is doing its users a favor by adding AI, LLMs, and ML (Machine Learning) to its products in such an understated and non-flashy fashion. The commentators’ cries for AI on Apple’s devices are, in my opinion, short-sighted and unreasonable. No, Apple should not add an AI that generates entire emails, hallucinates fake news, or generates images with questionable copyright status. I think Apple is striking just the right balance on AI, LLMs, and ML in its products.
Related Reading:
- Publishing AI Slop is a choice by John Gruber
- Google’s A.I. Answers Said to Put Glue in Pizza, So Katie Notopoulos Made Some Pizza by Nick Heer